A novel stochastic Hebb-like learning rule for neural networks
Author
Abstract
We present a novel stochastic Hebb-like learning rule for neural networks. This learning rule is stochastic with respect to the selection of the time points at which a synaptic modification is induced by pre- and postsynaptic activation. Moreover, the learning rule does not only affect the synapse between the pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also further remote synapses of the pre- and postsynaptic neurons. This form of plasticity, called heterosynaptic plasticity, has recently come into the focus of experimental investigations. Our learning rule gives a qualitative explanation of this kind of synaptic modification.
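The abstract does not spell out the update equations, so the following is only a minimal Python sketch of the general idea: with some probability a co-active synapse is selected for a Hebbian (homosynaptic) change, and the other synapses of the neurons involved receive a weaker heterosynaptic change. The function name and all parameters (p_update, eta, kappa), as well as the choice of a depressing heterosynaptic term, are illustrative assumptions and not the authors' rule.

import numpy as np

rng = np.random.default_rng(0)

def stochastic_hebb_step(W, x, y, p_update=0.1, eta=0.01, kappa=0.2):
    # W: (n_pre, n_post) weight matrix; x, y: non-negative pre-/postsynaptic activities.
    # With probability p_update a co-active synapse is modified at this time step
    # (stochastic selection of the modification time points).
    coactive = np.outer(x, y) > 0
    selected = coactive & (rng.random(W.shape) < p_update)

    # Homosynaptic term: Hebbian strengthening of the selected pre->post synapses.
    dW = eta * np.outer(x, y) * selected

    # Heterosynaptic term: the remaining synapses of the pre- and postsynaptic neurons
    # involved receive a weaker change (chosen here, arbitrarily, as depression).
    pre_involved = selected.any(axis=1, keepdims=True)
    post_involved = selected.any(axis=0, keepdims=True)
    neighbours = (pre_involved | post_involved) & ~selected
    dW -= kappa * eta * neighbours

    return W + dW

# Example: four presynaptic and three postsynaptic neurons.
W = 0.1 * rng.random((4, 3))
W = stochastic_hebb_step(W, x=np.array([1.0, 0.0, 1.0, 0.0]), y=np.array([0.0, 1.0, 1.0]))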
Similar resources
A Heterosynaptic Learning Rule for Neural Networks
In this article we introduce a novel stochastic Hebb-like learning rule for neural networks that is neurobiologically motivated. This learning rule combines features of unsupervised (Hebbian) and supervised (reinforcement) learning and is stochastic with respect to the selection of the time points when a synapse is modified. Moreover, the learning rule does not only affect the synapse between pr...
Active Learning in Recurrent Neural Networks Facilitated by a Hebb-like Learning Rule with Memory
We demonstrate in this article that a Hebb-like learning rule with memory paves the way for active learning in the context of recurrent neural networks. We compare active with passive learning and a Hebb-like learning rule with and without memory for the problem of timing to be learned by the neural network. Moreover, we study the influence of the topology of the recurrent neural network. Our r...
Bidirectional communication in neural networks moderated by a Hebb-like learning rule
We demonstrate that our recently introduced stochastic Hebb-like learning rule [7] is capable of learning the problem of timing in general network topologies generated by an algorithm of Watts and Strogatz [20]. We compare our results with a learning rule proposed by Bak and Chialvo [2, 4] and obtain not only a significantly better convergence behavior but also a dependence of the presentation ...
Temporal Hebbian Learning in Rate-Coded Neural Networks: A Theoretical Approach towards Classical Conditioning
A novel approach for learning of temporally extended, continuous signals is developed within the framework of rate-coded neurons. A new temporal Hebb-like learning rule is devised which utilizes the predictive capabilities of bandpass-filtered signals by using the derivative of the output to modify the weights. The initial development of the weights is calculated analytically applying signal th...
A generative model for Spike Time Dependent Hebbian Plasticity
Based on neurophysiological observations on the behavior of synapses, Spike Time Dependent Hebbian Plasticity (SDTHP) is a novel extension to the modeling of the Hebb Rule. This rule has enormous importance in the learning of Spiking Neural Networks (SNN), but its mechanisms and computational properties are still to be explored. Here, we present a generative model for SDTHP based on a simplified ...
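The last entry above refers to spike-time-dependent Hebbian plasticity. Its generative model is not reproduced here; purely as background, a standard pair-based exponential STDP window can be sketched as follows, with all amplitudes and time constants chosen here for illustration only.

import numpy as np

def stdp_delta_w(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    # Pair-based exponential STDP window; dt = t_post - t_pre in milliseconds.
    # Positive dt (pre before post) gives potentiation, negative dt gives depression.
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

# Example: the postsynaptic spike follows the presynaptic spike by 10 ms.
print(stdp_delta_w(10.0))  # approximately +0.006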
Journal:
Volume, Issue:
Pages: -
Publication date: 2003